Allow dynamic batch sizes in all the layers #195
Conversation
@@ -18,8 +18,10 @@ class Blob {
   explicit Blob(const int num, const int channels, const int height,
       const int width);
   virtual ~Blob() {}
-  void Reshape(const int num, const int height,
-      const int width, const int channels);
+  void Reshape(const int num, const int channels, const int height,
Thank you for making this consistent with the internal indexing by N x K x H x W.
This is welcome flexibility! Please update us when tests pass so that we can assign a reviewer. Thanks.
+      const int width);
+  void ReshapeBigEnough(const int num, const int channels, const int height,
+      const int width);
Add comments for this function?
This needs to be revised to be consistent with #250.
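Since the diff adds `ReshapeBigEnough` without a comment, here is a minimal self-contained sketch of one plausible semantics for it: reshape only when the requested extent exceeds the current one, with `Reshape` itself reallocating only when the element count grows. The class below is illustrative and uses hypothetical members; it is not Caffe's actual implementation.

```cpp
#include <cassert>
#include <vector>

// Illustrative Blob sketch, not Caffe's code. Assumption: memory is
// grow-only, so shrinking the logical shape never reallocates.
class Blob {
 public:
  Blob() : num_(0), channels_(0), height_(0), width_(0), capacity_(0) {}

  void Reshape(int num, int channels, int height, int width) {
    num_ = num; channels_ = channels; height_ = height; width_ = width;
    const int count = num * channels * height * width;
    if (count > capacity_) {  // grow-only: never release the buffer
      capacity_ = count;
      data_.resize(capacity_);
    }
  }

  // Hypothetical semantics: only reshape if the requested extent
  // exceeds the current one in some dimension; otherwise do nothing.
  void ReshapeBigEnough(int num, int channels, int height, int width) {
    if (num > num_ || channels > channels_ ||
        height > height_ || width > width_) {
      Reshape(num, channels, height, width);
    }
  }

  int num() const { return num_; }
  int capacity() const { return capacity_; }

 private:
  int num_, channels_, height_, width_, capacity_;
  std::vector<float> data_;
};
```

Under this reading, shrinking the batch is free (no allocation, no copy), which is what makes per-iteration batch-size changes cheap.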
In practical applications, many users need to feed batches of varying sizes into the network. The detailed design of this PR took shape in the discussion of #119.
#189 may involve similar but more complex blob memory operations; this PR focuses only on the batch size.
Many, but not all, of the layers currently pass the dynamic batch size test cases.
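To make the intended usage concrete, here is a small self-contained sketch of feeding varying batch sizes: the input blob is reshaped before each forward pass, with only `num` changing while channels, height, and width stay fixed. The struct, field names, and batch sizes are hypothetical illustrations, not this PR's code.

```cpp
#include <vector>

// Illustrative input blob with grow-only backing storage (assumption).
struct InputBlob {
  int num = 0, channels = 0, height = 0, width = 0;
  std::vector<float> data;  // backing store, grown lazily

  void Reshape(int n, int c, int h, int w) {
    num = n; channels = c; height = h; width = w;
    const std::size_t count =
        static_cast<std::size_t>(n) * c * h * w;
    if (count > data.size()) data.resize(count);  // grow only
  }
};

// Simulate several iterations with different batch sizes.
int varying_batch_demo() {
  InputBlob input;
  const int batches[] = {32, 7, 32, 1};  // hypothetical batch sizes
  int processed = 0;
  for (int n : batches) {
    input.Reshape(n, 3, 28, 28);  // only num changes per iteration
    processed += input.num;
  }
  return processed;  // 32 + 7 + 32 + 1 = 72
}
```

After the first iteration the buffer is large enough for every later batch, so the subsequent reshapes do no allocation at all.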